Optimality conditions for maximizers of the information divergence from an exponential family

Author

  • František Matúš
Abstract

The information divergence of a probability measure P from an exponential family E over a finite set is defined as the infimum of the divergences of P from Q subject to Q ∈ E. All directional derivatives of the divergence from E are explicitly found. To this end, the behaviour of the conjugate of a log-Laplace transform on the boundary of its domain is analysed. The first-order conditions for P to be a maximizer of the divergence from E are presented, including new ones when P is not projectable to E.
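Spelled out in standard Kullback-Leibler notation (the symbols below follow the usual conventions, which the page itself does not fix), the quantity studied in the paper is

\[
  D(P \,\|\, E) \;=\; \inf_{Q \in E} D(P \,\|\, Q),
  \qquad
  D(P \,\|\, Q) \;=\; \sum_{x} P(x)\,\log \frac{P(x)}{Q(x)},
\]

with the usual conventions that 0 log 0 = 0 and that D(P || Q) = +∞ when P is not absolutely continuous with respect to Q.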


Similar articles

On maximization of the information divergence from an exponential family

The information divergence of a probability measure P from an exponential family E over a finite set is defined as the infimum of the divergences of P from Q subject to Q ∈ E. For convex exponential families the local maximizers of this function of P are found. A general exponential family E of dimension d is enlarged to an exponential family of dimension at most 3d + 2 such that the local maximiz...


Maximizing multi-information

Stochastic interdependence of a probability distribution on a product space is measured by its Kullback-Leibler distance from the exponential family of product distributions (called multi-information). Here we investigate low-dimensional exponential families that contain the maximizers of stochastic interdependence in their closure. Based on a detailed description of the structure of probabili...
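As a gloss on this abstract (a standard definition, not quoted from the paper), multi-information is the divergence of a joint distribution from the product of its marginals, equivalently a difference of entropies:

\[
  I(X_1, \dots, X_n) \;=\; D\bigl(P \,\|\, P_{X_1} \otimes \cdots \otimes P_{X_n}\bigr)
  \;=\; \sum_{i=1}^{n} H(X_i) \;-\; H(X_1, \dots, X_n),
\]

where H denotes Shannon entropy; the quantity vanishes exactly when the coordinates are independent.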


Maximizing Multi-Information (Nihat Ay and Andreas Knauf)

We investigate the structure of the global maximizers of stochastic interdependence, which is measured by the Kullback-Leibler divergence of the underlying joint probability distribution from the exponential family of factorizable random fields (multi-information). As a consequence of our structure results, it follows that random fields with globally maximal multi-information are contained i...


Min-Max Kullback-Leibler Model Selection

This paper considers an information-theoretic min-max approach to the model selection problem. The aim of this approach is to select the member of a given parameterized family of probability models so as to minimize the worst-case Kullback-Leibler divergence from an uncertain “truth” model. Uncertainty of the truth is specified by an upper bound on the KL divergence relative to a given reference...
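Read literally, the abstract describes a problem of roughly the following shape; the symbols below (Q_θ for the parameterized model, P* for the reference, ε for the bound) are placeholders of our own, not the paper's notation:

\[
  \min_{\theta} \; \max_{P \,:\, D(P \,\|\, P^{*}) \le \varepsilon} \; D(P \,\|\, Q_{\theta}),
\]

i.e. pick the model whose worst-case divergence over a KL ball of radius ε around the reference is smallest.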


Maximizing the Divergence from a Hierarchical Model of Quantum States

We study many-party correlations quantified in terms of the Umegaki relative entropy (divergence) from a Gibbs family known as a hierarchical model. We derive these quantities from the maximum-entropy principle which was used earlier to define the closely related irreducible correlation. We point out differences between quantum states and probability vectors which exist in hierarchical models, ...
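For orientation (a standard definition, not taken from the paper), the Umegaki relative entropy of a density matrix ρ from a density matrix σ is

\[
  D(\rho \,\|\, \sigma) \;=\; \operatorname{Tr}\,\rho\,(\log \rho - \log \sigma),
\]

which reduces to the ordinary Kullback-Leibler divergence when ρ and σ commute, i.e. for diagonal density matrices (probability vectors).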



Journal:
  • Kybernetika

Volume 43, Issue -

Pages -

Publication date: 2007